The tropical double description method
We develop a tropical analogue of the classical double description method
allowing one to compute an internal representation (in terms of vertices) of a
polyhedron defined externally (by inequalities). The heart of the tropical
algorithm is a characterization of the extreme points of a polyhedron in terms
of a system of constraints which define it. We show that checking the
extremality of a point reduces to checking whether there is only one minimal
strongly connected component in a hypergraph. The latter problem can be solved
in almost linear time, which allows us to eliminate redundant generators
quickly. We report extensive tests (including benchmarks from an application
to static analysis) showing that the method experimentally outperforms the
previous ones by orders of magnitude. The present tools also lead to worst-case
bounds which improve on those provided by previous methods.
Comment: 12 pages, prepared for the Proceedings of the Symposium on
Theoretical Aspects of Computer Science, 2010, Nancy, France
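The extremality test above can be illustrated with a small sketch (in Python, with an ordinary directed graph standing in for the directed hypergraph of the paper; the encoding of the constraint system into the graph is not reproduced here): a candidate point would be declared extreme exactly when the associated graph has a single minimal strongly connected component, where "minimal" is taken here to mean a component with no incoming edges from other components.

```python
def sccs(graph):
    """Tarjan's algorithm: strongly connected components of a directed graph
    given as {node: [successors]} (nodes may also appear only as successors)."""
    index, low = {}, {}
    stack, on_stack, comps = [], set(), []
    counter = [0]

    def strongconnect(v):
        index[v] = low[v] = counter[0]
        counter[0] += 1
        stack.append(v)
        on_stack.add(v)
        for w in graph.get(v, ()):
            if w not in index:
                strongconnect(w)
                low[v] = min(low[v], low[w])
            elif w in on_stack:
                low[v] = min(low[v], index[w])
        if low[v] == index[v]:          # v is the root of a component
            comp = set()
            while True:
                w = stack.pop()
                on_stack.discard(w)
                comp.add(w)
                if w == v:
                    break
            comps.append(comp)

    for v in list(graph):
        if v not in index:
            strongconnect(v)
    return comps

def count_minimal_sccs(graph):
    """Number of components with no incoming edge from another component."""
    comps = sccs(graph)
    comp_of = {v: i for i, c in enumerate(comps) for v in c}
    has_incoming = set()
    for v, succs in graph.items():
        for w in succs:
            if comp_of[v] != comp_of[w]:
                has_incoming.add(comp_of[w])
    return sum(1 for i in range(len(comps)) if i not in has_incoming)
```

Counting components this way takes time linear in the size of the graph; the almost-linear bound in the abstract refers to the hypergraph setting, which this simplified sketch does not capture.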
IDEF5 Ontology Description Capture Method: Concept Paper
The results of research towards an ontology capture method referred to as IDEF5 are presented. Viewed simply as the study of what exists in a domain, ontology is an activity at work across the full range of human inquiry, prompted by the persistent effort to understand the world in which humankind finds itself - and which it has helped to shape. In the context of information management, ontology is the task of extracting the structure of a given engineering, manufacturing, business, or logistical domain and storing it in a usable representational medium. A key to effective integration is a system ontology that can be accessed and modified across domains and which captures common features of the overall system relevant to the goals of the disparate domains. If the focus is on information integration, then the strongest motivation for ontology comes from the need to support data sharing and function interoperability. In the correct architecture, an enterprise ontology base would allow the construction of an integrated environment in which legacy systems appear to be open-architecture integrated resources. If the focus is on system/software development, then support for the rapid acquisition of reliable systems is perhaps the strongest motivation for ontology. Finally, ontological analysis has been demonstrated to be an effective first step in the construction of robust knowledge-based systems.
A lens-coupled scintillation counter in cryogenic environment
In this work we present an elegant solution for a scintillation counter to be
integrated into a cryogenic system. Its distinguishing feature is the absence
of a continuous light guide coupling the scintillation and the photodetector
parts, operating at cryogenic and room temperatures respectively. The prototype
detector consists of a plastic scintillator with glued-in wavelength-shifting
fiber located inside a cryostat, a Geiger-mode Avalanche Photodiode (G-APD)
outside the cryostat, and a lens system guiding the scintillation light
re-emitted by the fiber to the G-APD through optical windows in the cryostat
shields. With a 0.8 mm diameter multiclad fiber and a 1 mm active-area G-APD,
the coupling efficiency of the "lens light guide" is about 50%. Reliable
performance of the detector down to 3 K is demonstrated.
Comment: 14 pages, 11 figures
Fracture driven by a Thermal Gradient
Motivated by recent experiments by Yuse and Sano (Nature, 362, 329 (1993)),
we propose a discrete model of linear springs for studying fracture in thin and
elastically isotropic brittle films. The method enables us to draw a map of the
stresses in the material. When a moving thermal gradient is imposed on the
material, the cracks generated by the model can branch or wiggle depending on
the driving parameters. The results may be compared with other recent
theoretical work, or used to design future experiments.
Comment: RevTeX file (9 pages) and 5 PostScript figures
Optimizing the double description method for normal surface enumeration
Many key algorithms in 3-manifold topology involve the enumeration of normal
surfaces, which is based upon the double description method for finding the
vertices of a convex polytope. Typically we are only interested in a small
subset of these vertices, thus opening the way for substantial optimization.
Here we give an account of the vertex enumeration problem as it applies to
normal surfaces, and present new optimizations that yield strong improvements
in both running time and memory consumption. The resulting algorithms are
tested using the freely available software package Regina.
Comment: 27 pages, 12 figures; v2: removed the 3^n bound from Section 3.3,
fixed the projective equation in Lemma 4.4, clarified "most triangulations"
in the introduction to Section 5; v3: replaced -ise with -ize for Mathematics
of Computation (note that this changes the title of the paper)
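The underlying double description step that the paper optimizes can be sketched for polyhedral cones (a minimal Python version; names are illustrative, and the adjacency test that keeps the output irredundant is deliberately omitted, so this naive form may emit redundant rays on larger inputs): each new halfspace splits the current rays into positive, negative, and incident sets, and positive/negative pairs are combined into new rays on the bounding hyperplane.

```python
import numpy as np

def dd_step(rays, a, eps=1e-9):
    """Restrict the cone spanned by `rays` to the halfspace {x : a.x >= 0}.
    Positive and incident rays survive; each positive/negative pair is
    combined into a new ray lying exactly on the hyperplane a.x = 0."""
    a = np.asarray(a, dtype=float)
    pos = [r for r in rays if a @ r > eps]
    neg = [r for r in rays if a @ r < -eps]
    zero = [r for r in rays if abs(a @ r) <= eps]
    # conic combination cancelling the a-component: a.x = 0 for each new ray
    new = [(a @ p) * n - (a @ n) * p for p in pos for n in neg]
    return pos + zero + new
```

Iterating this step over all inequalities turns an external (inequality) description into an internal (ray) description; the optimizations reported in the paper concern which rays are kept at each stage.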
Verifying Real-Time Systems using Explicit-time Description Methods
Timed model checking has been extensively researched in recent years. Many
new formalisms with time extensions and tools based on them have been
presented. On the other hand, Explicit-Time Description Methods aim to verify
real-time systems with general untimed model checkers. Lamport presented an
explicit-time description method using a clock-ticking process (Tick) to
simulate the passage of time together with a group of global variables for time
requirements. This paper proposes a new explicit-time description method with
no reliance on global variables. Instead, it uses rendezvous synchronization
steps between the Tick process and each system process to simulate time. This
new method achieves better modularity and facilitates the use of more complex
timing constraints. The two explicit-time description methods are implemented
in DIVINE, a well-known distributed-memory model checker. Preliminary
experimental results show that our new method, with better modularity, is
comparable to Lamport's method with respect to time and memory efficiency.
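The rendezvous scheme can be sketched with ordinary threads (a toy Python model, not the DIVINE encoding; the process structure and the deadline constraint are illustrative): a Tick thread and every system process meet on a barrier once per time unit, so each process tracks elapsed time in a local counter instead of reading a shared clock variable.

```python
import threading

def run(num_procs, deadline, horizon):
    """Simulate `horizon` time units; each process records when its local
    timer reaches `deadline`.  Returns the sorted list of firing processes."""
    barrier = threading.Barrier(num_procs + 1)   # all processes + Tick
    fired = []

    def process(pid):
        elapsed = 0                   # local time, no global clock variable
        while elapsed < horizon:
            barrier.wait()            # rendezvous with Tick: one unit passes
            elapsed += 1
            if elapsed == deadline:
                fired.append(pid)     # local timing constraint satisfied

    def tick():
        for _ in range(horizon):
            barrier.wait()            # release all processes for this unit

    threads = [threading.Thread(target=process, args=(i,))
               for i in range(num_procs)]
    threads.append(threading.Thread(target=tick))
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return sorted(fired)
```

Because every timer lives inside its own process, adding a new timed process does not touch any other part of the model, which is the modularity benefit the abstract claims for the rendezvous-based method.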
Theoretical description of deformed proton emitters: nonadiabatic coupled-channel method
The newly developed nonadiabatic method based on the coupled-channel
Schrödinger equation with Gamow states is used to study the phenomenon of
proton radioactivity. The new method, adopting the weak coupling regime of the
particle-plus-rotor model, allows for the inclusion of excitations in the
daughter nucleus. This can lead to rather different predictions for lifetimes
and branching ratios as compared to the standard adiabatic approximation
corresponding to the strong coupling scheme. Calculations are performed for
several experimentally seen, non-spherical nuclei beyond the proton dripline.
By comparing theory and experiment, we are able to characterize the angular
momentum content of the observed narrow resonance.
Comment: 12 pages including 10 figures
Microscopic description of light unstable nuclei with the stochastic variational method
The structure of light proton- and neutron-rich nuclei is studied in a
microscopic multicluster model using the stochastic variational method. This
approach enables us to describe the weakly bound nature of these nuclei in a
consistent way. Applications to various Li, Be, B, and C isotopes are
presented. The paper discusses the relation of this model to other models, as
well as possible extensions to p- and sd-shell nuclei.
Comment: 11 pages, LaTeX, no figures
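The stochastic selection at the core of the method can be illustrated on a toy problem (a hedged sketch: a one-dimensional harmonic oscillator with hbar = m = 1 in a Gaussian basis, not the microscopic multicluster Hamiltonian of the paper; for this choice all matrix elements are analytic): random basis widths are proposed, and the candidate giving the lowest variational ground-state energy is admitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def ground_energy(alphas):
    """Lowest eigenvalue of H = -1/2 d^2/dx^2 + 1/2 x^2 in the nonorthogonal
    basis exp(-a x^2), a in `alphas` (generalized eigenproblem H c = E S c,
    solved via a Cholesky factorization of the overlap matrix)."""
    a = np.asarray(alphas, dtype=float)
    c = a[:, None] + a[None, :]
    S = np.sqrt(np.pi / c)            # overlap   <a|b>
    T = (np.outer(a, a) / c) * S      # kinetic   <a|-(1/2) d^2/dx^2|b>
    V = S / (4.0 * c)                 # potential <a|x^2/2|b>
    L = np.linalg.cholesky(S)
    Linv = np.linalg.inv(L)
    return np.linalg.eigvalsh(Linv @ (T + V) @ Linv.T)[0]

def svm(basis_size=5, trials_per_step=50):
    """Stochastic variational method, greedy variant: grow the basis one
    Gaussian at a time, keeping the random width that lowers the energy most."""
    alphas = []
    for _ in range(basis_size):
        candidates = 10.0 ** rng.uniform(-1, 1, trials_per_step)
        best, best_e = None, np.inf
        for w in candidates:
            # skip near-duplicate widths to keep the overlap matrix
            # well conditioned
            if any(abs(np.log(w / a0)) < 0.5 for a0 in alphas):
                continue
            e = ground_energy(alphas + [w])
            if e < best_e:
                best, best_e = w, e
        if best is not None:
            alphas.append(best)
    return ground_energy(alphas)
```

By the variational principle the result can only approach the exact ground-state energy (0.5 for this oscillator) from above; this competitive selection of random nonlinear parameters is the mechanism the abstract refers to, here in a deliberately simplified form.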